An Inequality for the Sum of Independent Bounded Random Variables

Authors

Abstract

Similar Articles

On the generalization of Trapezoid Inequality for functions of two variables with bounded variation and applications

In this paper, a generalization of the trapezoid inequality for functions of two independent variables of bounded variation is given, together with some applications.
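
As a hedged aside (not from the paper): the sketch below shows what the two-variable trapezoid formula computes, namely the average of the four corner values of f over a rectangle times the rectangle's area, compared against a fine midpoint-rule reference integral. The gap between the two printed values is the quantity that trapezoid inequalities of this kind bound in terms of the function's variation; the helper names trapezoid_2d and integral_2d are hypothetical, and the paper's precise bound is not reproduced here.

```python
import numpy as np

def trapezoid_2d(f, a, b, c, d):
    # Two-variable trapezoid formula: the average of the four corner
    # values times the area of the rectangle [a, b] x [c, d].
    return (b - a) * (d - c) * (f(a, c) + f(a, d) + f(b, c) + f(b, d)) / 4.0

def integral_2d(f, a, b, c, d, n=2000):
    # Reference value via a fine midpoint rule on an n-by-n grid.
    xs = np.linspace(a, b, n, endpoint=False) + (b - a) / (2 * n)
    ys = np.linspace(c, d, n, endpoint=False) + (d - c) / (2 * n)
    X, Y = np.meshgrid(xs, ys)
    return f(X, Y).mean() * (b - a) * (d - c)

f = lambda x, y: np.sin(x) * np.exp(y)   # smooth, hence of bounded variation
approx = trapezoid_2d(f, 0.0, 1.0, 0.0, 1.0)
exact = integral_2d(f, 0.0, 1.0, 0.0, 1.0)
print(approx, exact, abs(approx - exact))  # the gap is what the inequality bounds
```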

Maximizing the Entropy of a Sum of Independent Random Variables

Let X1, …, Xn be n independent, symmetric random variables supported on the interval [−1, 1], and let Sn = X1 + · · · + Xn be their sum. We show that the differential entropy of Sn is maximized when X1, …, Xn−1 are Bernoulli, taking values +1 or −1 with equal probability, and Xn is uniformly distributed. This entropy maximization problem is due to Shlomo Shamai [1], who also conjectured the solution.
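
As a quick numerical illustration of the n = 2 case (a sketch, not code from the paper): the sum of a Bernoulli ±1 variable and a Uniform[−1, 1] variable is exactly Uniform[−2, 2], with differential entropy ln 4 ≈ 1.386 nats, while the sum of two Uniform[−1, 1] variables is triangular on [−2, 2], with entropy 1/2 + ln 2 ≈ 1.193 nats. The sketch below checks this with a crude histogram plug-in estimator; the helper name diff_entropy_hist is hypothetical, and only NumPy is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def diff_entropy_hist(samples, bins=400):
    # Crude plug-in estimate of differential entropy (in nats)
    # from a histogram density.
    density, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = density > 0
    return -np.sum(density[mask] * np.log(density[mask]) * widths[mask])

n_samples = 1_000_000

# Conjectured maximizer for n = 2: Bernoulli(+1/-1) plus Uniform[-1, 1].
# Their sum is exactly Uniform[-2, 2]: entropy ln 4 ~= 1.386 nats.
s_max = rng.choice([-1.0, 1.0], size=n_samples) + rng.uniform(-1, 1, n_samples)

# Competitor: two Uniform[-1, 1] variables. Their sum is triangular
# on [-2, 2]: entropy 1/2 + ln 2 ~= 1.193 nats.
s_tri = rng.uniform(-1, 1, n_samples) + rng.uniform(-1, 1, n_samples)

print(diff_entropy_hist(s_max))  # ~= 1.386
print(diff_entropy_hist(s_tri))  # ~= 1.193
```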

A Comparison Inequality for Sums of Independent Random Variables

We give a comparison inequality that allows one to estimate the tail probabilities of sums of independent Banach-space-valued random variables in terms of those of sums of independent identically distributed random variables. More precisely, let X1, …, Xn be independent Banach-valued random variables, and let I be a random variable independent of X1, …, Xn and uniformly distributed over {1, …, n}…

On the Maximum Partial Sum of Independent Random Variables.

this becomes false if (BI), (Be) and (B) are replaced by (Nt), (NM) and (N), respectively. This follows, even for p = 2 = q, from the above example proving that (NW) is not linear. Correspondingly, (Ne) cannot be interpreted as the dual space of (NP), since such an interpretation would involve the definition of a scalar product. 7. Let (Nt) denote the space which relates to the space (Nt) in th...

An entropy inequality for symmetric random variables

We establish a lower bound on the entropy of weighted sums of (possibly dependent) random variables (X1, X2, …, Xn) possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of (X1, X2, …, Xn). We show that for n ≥ 3 the lower bound is tight if and only if the Xi's are i.i.d. Gaussian random variables. For n = 2 there are numerous other cases of equality…

Journal

Journal title: Journal of Theoretical Probability

Year: 2012

ISSN: 0894-9840, 1572-9230

DOI: 10.1007/s10959-012-0460-1